Newer cards are also probably going to generate a lot more heat than older ones. It seems every time I upgrade to a better graphics card my temperature sensors register quite a nice jump... GW will run at ~40C on my Linux box with a GeForce 6200 OC, it ran at ~50C back when that box had a Radeon X700 Pro, and my laptop's 8600M GT more or less idles around 70-75C, averages ~80-90C with GW, and if it gets near 95-100C I decide to take a break for a while.
It's pretty much the same with all new components that come out these days - the more powerful they become, the more heat they tend to generate. Honestly, I usually just end up playing in a window to keep the temperature around 80C. :/ Back when I posted here regularly a few years ago, 90C was the general "oh snap" temperature (if I remember correctly), and that still seems pretty sensible when applied to today's video cards, so I'd have to agree with Nanood. Some high-end cards are designed to run hotter than others, but I'm still not very keen on running over 200 degrees Fahrenheit for extended periods of time. :P
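For what it's worth, 90C works out to 194F, so "over 200F" lands right around the 93-95C mark. If anyone wants to script a quick sanity check instead of watching a sensor widget, something like the sketch below is roughly what I'd do on a Linux/NVIDIA setup. The nvidia-smi call and the 90C cutoff are just my own assumptions, adjust to whatever your card's comfort zone is:

```python
import subprocess

def c_to_f(celsius):
    """Convert Celsius to Fahrenheit (90C -> 194F)."""
    return celsius * 9 / 5 + 32

def gpu_temp_c():
    """Read the current GPU temperature in Celsius via nvidia-smi.
    Assumes an NVIDIA card with a driver that ships nvidia-smi."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=temperature.gpu", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    # Take the first line in case there is more than one GPU reported.
    return int(out.stdout.strip().splitlines()[0])

if __name__ == "__main__":
    temp = gpu_temp_c()
    print(f"GPU: {temp}C ({c_to_f(temp):.0f}F)")
    if temp >= 90:  # the old "oh snap" threshold from this thread
        print("Probably time to take a break.")
```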